903 research outputs found

    The Impact of Individual Learning on Electronic Health Record Routinization: An Empirical Study

    Since the passage of the HITECH Act, adoption of electronic health records (EHR) has increased significantly. An EHR is an electronic version of a patient’s medical history. The adoption of EHR has the potential to reduce medical errors, duplication of testing, and delays in treatment. However, the current literature indicates that implementing an EHR does not automatically lead to its routinization. Routinization refers to the notion that truly successful technological innovations are no longer perceived as being new or out of the ordinary. The complexity of EHRs allows individual users to use these systems at different levels of sophistication. Research shows that healthcare professionals adopt non-standard ways of using, or circumventing, the EHR to complete their work and make limited use of EHR systems. Further, although workarounds may seem necessary to physicians and are not perceived as problematic, they can pose a threat to patient safety and hinder the potential benefits. Hence, we argue that EHR implementations fall short of their potential because of a lack of routinization. Any new technological innovation requires physician support and a willingness to learn about the system in order to move to the routinization phase of implementation. We therefore draw on the literature on organizational learning, individual learning, and routines to understand the factors that influence EHR routinization.

    Abstracts from the NIHR INVOLVE Conference 2017

    n/a

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Peer reviewed

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹. Peer reviewed
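    To make the embedding idea above concrete, here is a toy sketch in plain Python. All names (Particle, decay_tau, embed_event) are hypothetical and this is not the CMS implementation: it simply removes the reconstructed muons from a μμ event, inserts τ leptons carrying the same four-momenta, and keeps the rest of the event from data; the τ decay step, the only simulated ingredient in the real technique, is a trivial placeholder.

```python
# Toy sketch of the mu -> tau embedding idea; hypothetical names, not CMS code.
from dataclasses import dataclass, replace
from typing import List

@dataclass
class Particle:
    pid: int          # PDG-style identifier (13 = muon, 15 = tau)
    px: float
    py: float
    pz: float
    e: float

def decay_tau(tau: Particle) -> List[Particle]:
    """Placeholder for the simulated tau decay; in the real technique this is
    the only simulated step. Here the tau is returned undecayed."""
    return [tau]

def embed_event(event: List[Particle]) -> List[Particle]:
    """Remove reconstructed muons and replace them with tau leptons that carry
    the same kinematics, keeping the rest of the event (jets, etc.) from data."""
    muons = [p for p in event if abs(p.pid) == 13]
    rest = [p for p in event if abs(p.pid) != 13]
    hybrid = list(rest)
    for mu in muons:
        tau = replace(mu, pid=15 if mu.pid > 0 else -15)  # same px, py, pz, e
        hybrid.extend(decay_tau(tau))
    return hybrid

if __name__ == "__main__":
    recorded = [Particle(13, 30.0, 1.0, 5.0, 30.4),     # mu+ from data
                Particle(-13, -28.0, -2.0, 3.0, 28.2),  # mu- from data
                Particle(1, 50.0, 10.0, 20.0, 55.0)]    # jet-like object kept from data
    print(embed_event(recorded))
```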

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (μ̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude of |d̂_t| < 0.03 at 95% confidence level.
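    For reference, a forward-backward asymmetry of this kind is conventionally defined as a counting asymmetry in the sign of the top-quark production angle θ* in the tt̄ rest frame; the linearized variable A_FB^(1) used in the paper approximates such a quantity, and its exact construction is not reproduced here.

```latex
% Generic counting definition of a forward-backward asymmetry (illustrative only;
% the paper's linearized A_FB^(1) is a specific approximation of this).
\[
  A_{\mathrm{FB}}
  = \frac{N(\cos\theta^{*} > 0) - N(\cos\theta^{*} < 0)}
         {N(\cos\theta^{*} > 0) + N(\cos\theta^{*} < 0)}
\]
```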

    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions.
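    As an illustration of how such correlation functions are typically built, the sketch below (toy Python/NumPy with hypothetical names and toy data) histograms a pair variable for same-event pairs and divides by a mixed-event reference; the three background-removal techniques used in the paper are not reproduced here.

```python
# Toy two-particle correlation function: same-event pairs over an event-mixed reference.
import numpy as np

rng = np.random.default_rng(0)

def pair_q(p1: np.ndarray, p2: np.ndarray) -> float:
    """Toy 'relative momentum' of a pair: magnitude of the 3-momentum difference."""
    return float(np.linalg.norm(p1 - p2))

def pair_distribution(events, same_event: bool, bins):
    """Histogram pair_q over pairs within events (signal) or across
    consecutive events (mixed-event reference)."""
    qs = []
    if same_event:
        for ev in events:
            for i in range(len(ev)):
                for j in range(i + 1, len(ev)):
                    qs.append(pair_q(ev[i], ev[j]))
    else:
        for ev_a, ev_b in zip(events[:-1], events[1:]):
            for p1 in ev_a:
                for p2 in ev_b:
                    qs.append(pair_q(p1, p2))
    counts, _ = np.histogram(qs, bins=bins)
    return counts.astype(float)

# Toy events: lists of particle 3-momenta (GeV), ~20 tracks per event.
events = [rng.normal(0.0, 1.0, size=(20, 3)) for _ in range(200)]
bins = np.linspace(0.0, 4.0, 21)

signal = pair_distribution(events, same_event=True, bins=bins)
reference = pair_distribution(events, same_event=False, bins=bins)
sig_norm = signal / signal.sum()
ref_norm = reference / reference.sum()
# C2(q) ~ normalized signal / normalized reference, undefined where the reference is empty.
c2 = np.divide(sig_norm, ref_norm, out=np.full_like(sig_norm, np.nan), where=ref_norm > 0)
print(np.round(c2, 3))
```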

    Search for dark matter in events with a leptoquark and missing transverse momentum in proton-proton collisions at 13 TeV

    A search is presented for dark matter in proton-proton collisions at a center-of-mass energy of √s = 13 TeV using events with at least one high transverse momentum (pT) muon, at least one high-pT jet, and large missing transverse momentum. The data were collected with the CMS detector at the CERN LHC in 2016 and 2017, and correspond to an integrated luminosity of 77.4 fb⁻¹. In the examined scenario, a pair of scalar leptoquarks is assumed to be produced. One leptoquark decays to a muon and a jet while the other decays to dark matter and low-pT standard model particles. The signature for signal events would be significant missing transverse momentum from the dark matter in conjunction with a peak at the leptoquark mass in the invariant mass distribution of the highest pT muon and jet. The data are observed to be consistent with the background predicted by the standard model. For the first benchmark scenario considered, dark matter masses up to 500 GeV are excluded for leptoquark masses m_LQ ≈ 1400 GeV, and up to 300 GeV for m_LQ ≈ 1500 GeV. For the second benchmark scenario, dark matter masses up to 600 GeV are excluded for m_LQ ≈ 1400 GeV. Peer reviewed
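    The signature described above can be illustrated with a short, hypothetical Python sketch: require large missing transverse momentum and reconstruct the invariant mass of the leading-pT muon and leading-pT jet, which would peak near m_LQ for signal events. The threshold and event layout are illustrative assumptions, not the selection used in the analysis.

```python
# Toy selection: large MET plus invariant mass of the leading muon and leading jet.
import math

def invariant_mass(p1, p2):
    """Invariant mass of two four-vectors given as (E, px, py, pz)."""
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(e * e - (px * px + py * py + pz * pz), 0.0))

def pt(p):
    """Transverse momentum of a (E, px, py, pz) four-vector."""
    return math.hypot(p[1], p[2])

def select_and_reconstruct(event):
    """Return m(mu, jet) for events passing a toy selection, else None."""
    if event["met"] < 250.0:                   # hypothetical MET threshold (GeV)
        return None
    if not event["muons"] or not event["jets"]:
        return None
    lead_mu = max(event["muons"], key=pt)      # highest-pT muon
    lead_jet = max(event["jets"], key=pt)      # highest-pT jet
    return invariant_mass(lead_mu, lead_jet)   # would peak near m_LQ for signal

example = {
    "met": 300.0,
    "muons": [(210.0, 200.0, 20.0, 50.0)],
    "jets": [(600.0, -450.0, -30.0, 390.0)],
}
print(select_and_reconstruct(example))
```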

    Search for an Lμ − Lτ gauge boson using Z → 4μ events in proton-proton collisions at √s = 13 TeV

    A search for a narrow Z' gauge boson with a mass between 5 and 70 GeV resulting from an Lμ − Lτ U(1) local gauge symmetry is reported. Theories that predict such a particle have been proposed as an explanation of various experimental discrepancies, including the lack of a dark matter signal in direct-detection experiments, tension in the measurement of the anomalous magnetic moment of the muon, and reports of possible lepton flavor universality violation in B meson decays. A data sample of proton-proton collisions at a center-of-mass energy of 13 TeV is used, corresponding to an integrated luminosity of 77.3 fb⁻¹ recorded in 2016 and 2017 by the CMS detector at the LHC. Events containing four muons with an invariant mass near the standard model Z boson mass are analyzed, and the selection is further optimized to be sensitive to events that may contain Z → Z'μμ → 4μ decays. The event yields are consistent with the standard model predictions. Upper limits of 10⁻⁸ to 10⁻⁷ at 95% confidence level are set on the product of branching fractions B(Z → Z'μμ)B(Z' → μμ), depending on the Z' mass, which excludes a Z' boson coupling strength to muons above 0.004-0.3. These are the first dedicated limits on Lμ − Lτ models at the LHC and result in a significant increase in the excluded model parameter space. The results of this search may also be used to constrain the coupling strength of any light Z' gauge boson to muons. Peer reviewed
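    A minimal sketch of the selection logic described above, assuming a toy event layout (not the CMS analysis code): keep four-muon events whose invariant mass lies near the Z boson mass, then inspect the opposite-sign dimuon pair masses, where a Z' would appear as a narrow peak. The mass window is a hypothetical choice for illustration.

```python
# Toy Z -> Z' mu mu -> 4 mu selection: 4-muon mass near m_Z, then dimuon pair masses.
import math
from itertools import combinations

M_Z = 91.19  # GeV

def mass(particles):
    """Invariant mass of a list of (E, px, py, pz, charge) tuples."""
    e = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def dimuon_candidates(muons):
    """Opposite-sign dimuon pair masses from a 4-muon event whose total mass
    lies within a hypothetical window around the Z boson mass."""
    if len(muons) != 4 or abs(mass(muons) - M_Z) > 15.0:
        return []
    return [mass([a, b])
            for a, b in combinations(muons, 2)
            if a[4] * b[4] < 0]  # opposite charge only

# One toy event: four muons given as (E, px, py, pz, charge).
event = [(45.0, 30.0, 20.0, 25.0, +1),
         (40.0, -28.0, -18.0, -20.0, -1),
         (10.0, 5.0, 6.0, -5.0, +1),
         (8.0, -4.0, -5.0, 3.0, -1)]
print(dimuon_candidates(event))
```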